Adaptive Finite-Difference Interval Estimation for Noisy Derivative-Free Optimization

Authors

Abstract

A common approach for minimizing a smooth nonlinear function is to employ finite-difference approximations to the gradient. While this can be easily performed when no error is present within the function evaluations, when the function is noisy, the optimal choice requires information about the noise level and higher-order derivatives of the function, which is often unavailable. Given the noise level, we propose a bisection search for finding a finite-difference interval for any finite-difference scheme that balances the truncation error, which arises from the error in the Taylor series approximation, and the measurement error, which results from noise in the function evaluation. Our procedure produces reliable estimates of the finite-difference interval at low cost without explicitly approximating higher-order derivatives. We show its numerical reliability and accuracy on a set of test problems. When combined with limited-memory BFGS, we obtain a robust method for minimizing noisy black-box functions, as illustrated on a subset of unconstrained CUTEst problems with synthetically added noise.
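The bisection idea described in the abstract can be sketched as follows. This is a minimal illustration only: the second-difference noise test, the ratio band `[r_lo, r_hi]`, and the initial scale are illustrative assumptions, not the paper's exact procedure or constants.

```python
import math

def estimate_fd_interval(f, x, eps_f, r_lo=3.0, r_hi=10.0, max_iter=50):
    """Bisection-style search for a finite-difference interval h.

    Balances truncation error (dominant when h is too large) against
    measurement error (dominant when h is too small) by testing whether
    the curvature signal in a second difference stands out above the
    known noise level eps_f. The band [r_lo, r_hi] is an assumption
    made for this sketch.
    """
    h = math.sqrt(eps_f)  # crude initial scale
    lo, hi = 0.0, float("inf")
    for _ in range(max_iter):
        # Second difference: ~ h^2 * |f''(x)| when noise is negligible.
        delta = abs(f(x + h) - 2.0 * f(x) + f(x - h))
        r = delta / eps_f
        if r < r_lo:
            # Noise dominates the signal: enlarge h.
            lo = h
            h = 2.0 * h if hi == float("inf") else 0.5 * (lo + hi)
        elif r > r_hi:
            # Truncation error dominates: shrink h.
            hi = h
            h = 0.5 * (lo + hi)
        else:
            break  # signal-to-noise ratio is in the acceptable band
    return h
```

For example, on f(t) = t² with an assumed noise level of 1e-8, the returned interval yields a forward-difference gradient close to the true derivative without any explicit estimate of f''.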


Similar Articles

Noisy Derivative-free Optimization with Value Suppression

Derivative-free optimization has shown advantage in solving sophisticated problems such as policy search, when the environment is noise-free. Many real-world environments are noisy, where solution evaluations are inaccurate due to the noise. Noisy evaluation can badly injure derivative-free optimization, as it may make a worse solution look better. Sampling is a straightforward way to reduce n...


Randomized Derivative-Free Optimization of Noisy Convex Functions

We propose STARS, a randomized derivative-free algorithm for unconstrained optimization when the function evaluations are contaminated with random noise. STARS takes dynamic, noise-adjusted smoothing stepsizes that minimize the least-squares error between the true directional derivative of a noisy function and its finite difference approximation. We provide a convergence rate analysis of STARS ...
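The noise-adjusted stepsize idea in this teaser rests on the same textbook tradeoff: for a forward difference with noise level eps_f and a bound L2 on the second derivative, the worst-case error bound e(h) = L2·h/2 + 2·eps_f/h is minimized in closed form. The sketch below shows this classical rule, not the STARS stepsize formula itself.

```python
import math

def optimal_forward_step(eps_f, L2):
    """Stepsize minimizing the worst-case forward-difference error bound
    e(h) = L2*h/2 + 2*eps_f/h (truncation term + measurement term).

    Setting e'(h) = L2/2 - 2*eps_f/h^2 = 0 gives h* = 2*sqrt(eps_f/L2).
    """
    return 2.0 * math.sqrt(eps_f / L2)
```

In practice L2 is unknown, which is precisely why adaptive schemes such as the bisection search of the main paper, or the dynamic stepsizes the teaser describes, estimate the balance point from function values instead.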


Derivative-free optimization methods for finite minimax problems

Derivative-free optimization focuses on designing methods to solve optimization problems without the analytical knowledge of the function. In this paper we consider the problem of designing derivative-free methods for finite minimax problems: min_x max_{i=1,2,...,N} f_i(x). In order to solve the problem efficiently, we seek to exploit the smooth substructure within the problem. Using ideas developed...


Derivative-Free Optimization for Parameter Estimation in Computational Nuclear Physics

We consider optimization problems that arise when estimating a set of unknown parameters from experimental data, particularly in the context of nuclear density functional theory. We examine the cost of not having derivatives of these functionals with respect to the parameters. We show that the POUNDERS code for local derivative-free optimization obtains consistent solutions on a variety of comp...


Adaptive Noisy Optimization

In this paper, adaptive noisy optimization on variants of the noisy sphere model is considered, i.e. optimization in which the same algorithm is able to adapt to several frameworks, including some for which no bound has previously been derived. Incidentally, bounds derived by [16] for noise quickly decreasing to zero around the optimum are extended to the more general case of a positively lower-boun...



Journal

Journal: SIAM Journal on Scientific Computing

Year: 2022

ISSN: 1095-7197, 1064-8275

DOI: https://doi.org/10.1137/21m1452470